Health care is approaching a critical inflection point. Clinicians across the country are questioning whether they can remain in a profession in which so much is expected of them, yet policies, resources, and infrastructure are not aligned to allow them to perform their best work in safe and sustainable ways. Clinicians must constantly adapt to unnecessarily complex information systems and cumbersome workflows to provide the best care for their patients. Instead of this inefficient and ineffective environment, we need systems designed for both human capacity and human limitations. In this commentary, we propose that designing for human capacity by integrating human factors science (incorporating an understanding of behaviors, proficiency, and limitations into system design) and a safety II framework (viewing human creativity as an essential safety safeguard rather than the root cause of error) is essential to the future of health care safety and the sustainability of its workforce. Doing so would not only improve health care system safety and resilience but also recenter relationships and clinician expertise in health care design and delivery.

The traditional approach to safety, termed safety I,1 focuses on identifying and eliminating errors by isolating the root cause. Humans are often seen as the root cause: a problem to be fixed and a source of liability and hazard to be protected against. This approach inadvertently sends clinicians the message that they are not to be trusted, that they are the broken ones.
It also produces rigid systems that impose unnecessary cognitive load on clinicians, who must work around standardized rules to ensure safety for the unique circumstances of individual patients. In complex systems, such as health care work environments, human judgment and adaptability are essential for patient safety.

The safety II approach emphasizes humans as the resource necessary for system flexibility, safety, and resilience. Rather than being seen as the primary threat to safety, humans are seen as necessary to adapt the system to the needs of individual patients.1 Clinicians are given the agency and authority to flex standardized processes to meet the nuanced needs of individual patients and families, the very work they trained for. Systems are explicitly designed to draw on human skill and creativity while limiting unnecessary cognitive burden and underutilization of training and talent. This allows clinicians to play their essential roles of customizing care to the unique needs of individual patients, building relationships, and healing. Safety II is a systems-based approach that views safety not as the absence of error but as the presence of resilience. It is based on the understanding that errors can and will always occur, no matter how well a system is designed. Rather than attempting to eliminate all errors, we should instead focus on creating systems that can not only anticipate and avoid error but also adapt to and recover from errors in a way that supports and leverages human capacity.
Human factors science incorporates an understanding of human behaviors, proficiency, and limitations into design, with the goal of creating safer, more intuitive, efficient, and user-friendly systems, technologies, and processes.2 When systems are not designed with human capacity in mind, the result can be a paradoxical increase in safety hazards, higher rates of burnout, and, ultimately, an exodus of health care workers that puts the entire system at risk. For example, many health care systems have established a policy of copying a patient's primary care physician on every test ordered by other physicians in other settings, flooding the primary care physician's inbox and causing both information overload and ambiguity about responsibility for follow-up. Both outcomes represent major safety hazards. A human factors approach recognizes that such attempts at "safety" through redundancy can backfire. A safer approach pairs clear policies, such as "you order it, you own it," with thoughtful information flows: test results are not automatically copied to anyone other than the ordering physician, who then determines, case by case, whether another physician needs to be notified. For clarity, some organizations require a brief note indicating why a result is being copied to another physician. In addition, inbox volume is one of the primary determinants of "work outside of work," also known as "pajama time," the personal time physicians spend during off hours working in the electronic health record. Those in the highest quartile for work outside of work have 11-fold higher odds of burnout than those in the lowest quartile.3 Similarly, physicians in the top quartile of inbox message volume have 6-fold higher odds of burnout than those in the bottom quartile. Physicians who experience burnout are twice as likely to leave their organization within 2 years as those without burnout.4 Focusing improvement efforts on designing systems that support the strengths of human attention and connection, rather than on well-intentioned redundancy that overloads health care workers, will lead to a safer, more resilient system.

Designing for both human capacity (safety II) and human limitations (human factors engineering) is necessary to overcome these challenges. Much of today's hazardous and haphazard health care work environment is the result of a series of disconnected design decisions that failed to consider both human capacity and human limitations: people with all of their messy and magnificent abilities and shortcomings, and people who need trust and a manageable workload to succeed.

What might this integration look like? First, all clinical and operational leaders adopt the safety II mindset that clinicians, with their training and creativity, are the essential system safety safeguards whose time and talent need to be protected from "stupid stuff"5 to function effectively. This mindset leads them to invest in standardization of predictable work that reduces extraneous cognitive load and decision fatigue.6 Next, clinicians and human factors engineers collaborate to design reasonable safeguards that protect human beings from making unnecessary errors. Finally, although these safeguards may at times present hard stops to prevent an egregious error, most would be gentle nudges. Furthermore, safety will be achieved by systems that are designed to reduce non–value-added work, allow more time for relationship building, and facilitate customization of care to the unique needs of individual patients and their circumstances.

This could be operationalized using the adapted organizational accident causation model of Taylor-Adams and Vincent,7 which breaks down the contributory factors along the path to an incident and has been useful for illustrating system-wide thinking, tracing upstream leadership and management decisions to their downstream effects. The work of Pasmore et al8 on sociotechnical systems design for effective organizations further develops the systems-thinking foundation of safety II regarding systemic contributors to doing things right: (1) joint optimization of people and technology (critically important now, to balance the emphasis on technology solutions to systemic issues) and (2) "environmental sensors" that give senior leadership needed feedback on how effectively the organization is achieving its intended mission. In health care, these sensors are both patients and clinicians. Patient experience feedback already reaches leadership; clinician experience feedback is also needed to give decision makers the data required to adjust how patient care is delivered.

Overall, this represents a shift in culture and leadership, such that clinicians are viewed, empowered, and resourced by leaders to create, criticize, and continuously improve system safety standards. It also requires augmenting personnel by engaging human factors engineers who can help measure and manage the cognitive load and other human factors of system design to create resilience to unanticipated events.9 Finally, it requires an approach to information technology purchase, adaptation, and integration that draws on the expertise of both clinical and human factors experts to prevent regulatory or reimbursement considerations from overwhelming the safety and relational focus of clinicians with their patients and team members.

The stress of the coronavirus disease 2019 pandemic exacerbated and drew attention to the unsafe clinical environments, work overload, and burnout that have been endemic in health care for decades. Health care has reached a crisis point at which action must be taken to protect the resilience of the system and its most critical resources for safety and healing: health care team members.

References
1. Hollnagel EW, Wears RL, Braithwaite J. From Safety-I to Safety-II: a white paper. The Resilient Health Care Net; 2015. https://www.england.nhs.uk/signuptosafety/wp-content/uploads/sites/16/2015/10/safety-1-safety-2-whte-papr.pdf. Accessed May 22, 2023.
2. Russ AL, Fairbanks RJ, Karsh BT, Militello LG, Saleem JJ, Wears RL. The science of human factors: separating fact from fiction. BMJ Qual Saf. 2013;22:802-808. https://doi.org/10.1136/bmjqs-2012-001450
3. Adler-Milstein J, Zhao W, Willard-Grace R, Knox M, Grumbach K. Electronic health records and burnout: time spent on the electronic health record after hours and message volume associated with exhaustion but not with cynicism among primary care clinicians. J Am Med Inform Assoc. 2020;27:531-538.
4. Hamidi MS, Bohman B, Sandborg C, et al. Estimating institutional physician turnover attributable to self-reported burnout and associated financial burden: a case study. BMC Health Serv Res. 2018;18:851.
5. Ashton M. Getting rid of stupid stuff. N Engl J Med. 2018;379:1789-1791.
6. Harry E, Pierce RG, Kneeland P, Huang G, Stein J, Sweller J. Cognitive load and its implications for health care. NEJM Catal. 2018;4. https://catalyst.nejm.org/doi/full/10.1056/CAT.18.0233. Accessed May 22, 2023.
7. Taylor-Adams S, Vincent C. Systems Analysis of Clinical Incidents: The London Protocol. Clinical Safety Research Unit, Imperial College London; 2004.
8. Pasmore W, Winby S, Mohrman SA, Vanasse R. Reflections: sociotechnical systems design and organization change. J Change Manage. 2019;19:67-85.
9. Catchpole K, Bowie P, Fouquet S, Rivera J, Hignett S. Frontiers in human factors: embedding specialists in multi-disciplinary efforts to improve healthcare. Int J Qual Health Care. 2021;33:13-18.